Applying a Model





Kerry Back

Overview

  • We can save a trained model to disk.
  • Later, load it and make predictions for new data.
  • Rank stocks on the predictions and trade (more later).
  • It is important to interpret the model.
    • What combinations of features does it like/dislike?
    • What industries does it like/dislike?
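One simple way to see what feature combinations a fitted model likes is to evaluate it on a grid of feature values and inspect where predictions are highest. A minimal sketch of that idea, using synthetic stand-in data rather than the real 2021-01 sample (the grid bounds and sample size here are illustrative choices):

```python
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor

# synthetic stand-in for the real training data
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "roeq": rng.normal(size=500),
    "mom12m": rng.normal(size=500),
})
data["rnk"] = (data.roeq + data.mom12m).rank(pct=True)

model = MLPRegressor(hidden_layer_sizes=(4, 2), random_state=0, max_iter=2000)
model.fit(data[["roeq", "mom12m"]], data["rnk"])

# evaluate the model on a grid of feature combinations
grid = pd.DataFrame(
    [(r, m) for r in np.linspace(-2, 2, 5) for m in np.linspace(-2, 2, 5)],
    columns=["roeq", "mom12m"],
)
grid["prediction"] = model.predict(grid[["roeq", "mom12m"]])

# the combinations the model likes best
print(grid.sort_values("prediction", ascending=False).head())
```

Sorting the grid by prediction shows which corners of feature space the model favors; the same idea extends to industry dummies.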

Repeat prior example

  • Use roeq and mom12m for 2021-01 as before.
  • Predict rnk as before.
  • Use a neural net with two hidden layers: 4 neurons in the first and 2 in the second.

Define, fit and save model

from sklearn.neural_network import MLPRegressor

# features and target from the 2021-01 data loaded previously
X = data[["roeq", "mom12m"]]
y = data["rnk"]

model = MLPRegressor(
    hidden_layer_sizes=(4, 2),
    random_state=0
)
model.fit(X, y)

# save the fitted model to disk
from joblib import dump, load
dump(model, "net-example.joblib")
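A quick sanity check after saving: load the file back and confirm the reloaded model reproduces the original predictions. A minimal round-trip sketch, using synthetic data in place of the real sample (the filename matches the example above):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from joblib import dump, load

# synthetic stand-in for the training data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X.sum(axis=1)

model = MLPRegressor(hidden_layer_sizes=(4, 2), random_state=0, max_iter=2000)
model.fit(X, y)
dump(model, "net-example.joblib")

# reload and verify the predictions match exactly
reloaded = load("net-example.joblib")
assert np.array_equal(model.predict(X), reloaded.predict(X))
```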

Example of applying a model

  • As an example, treat 2021-02 as new data.
  • We’re at the beginning of 2021-02. We know the predictors but not the returns.
  • We apply the model to the predictors to predict returns.
  • We use the predicted returns to form a portfolio.

Get the 2021-02 data

# conn is the database connection opened earlier
new = pd.read_sql(
    """
    select ticker, date, ret, roeq, mom12m
    from data
    where date='2021-02'
    """,
    conn
)
new = new.dropna()
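The same query can be tried end-to-end against a throwaway in-memory SQLite database; the two rows written to the table here are invented purely for illustration:

```python
import sqlite3
import pandas as pd

# in-memory database standing in for the course database
conn = sqlite3.connect(":memory:")
pd.DataFrame({
    "ticker": ["AWH", "MARA"],
    "date": ["2021-02", "2021-02"],
    "ret": [-0.223094, 0.453713],
    "roeq": [-0.722558, -0.681703],
    "mom12m": [4.217742, 4.217742],
}).to_sql("data", conn, index=False)

new = pd.read_sql(
    "select ticker, date, ret, roeq, mom12m from data where date='2021-02'",
    conn,
)
new = new.dropna()
```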

Load and apply the model

model = load("net-example.joblib")

X = new[["roeq", "mom12m"]]

# predicted rank for each stock
new["prnk"] = model.predict(X)

Order stocks by predictions

First stock is best, second is next best, etc.

new = new.sort_values(
    by="prnk",
    ascending=False
)
new.head(10)
      ticker     date       ret      roeq    mom12m      prnk
2016     AWH  2021-02 -0.223094 -0.722558  4.217742  0.938684
28      KSPN  2021-02 -0.172718 -1.526803  3.228324  0.933077
525     MARA  2021-02  0.453713 -0.681703  4.217742  0.931521
247     CRDF  2021-02 -0.123729 -0.671333  4.217742  0.929703
135     PACB  2021-02 -0.055023 -0.382200  4.217742  0.879009
2405    RIOT  2021-02  1.131579 -0.318611  4.217742  0.867860
710     ACRS  2021-02  0.073735 -0.198883  4.217742  0.846868
611     SUNW  2021-02 -0.256483 -0.153112  4.217742  0.838843
232     ENPH  2021-02 -0.034494 -0.149567  4.217742  0.838221
1390    FCEL  2021-02 -0.184008 -0.140428  4.217742  0.836619
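Ranking on predictions and trading can be sketched with pandas alone: form an equal-weighted portfolio of the top stocks by prnk and compute its realized return. This sketch uses a few rows from the table above; the cutoff of three stocks is an illustrative choice:

```python
import pandas as pd

# a few predictions and realized returns from the ranked table
new = pd.DataFrame({
    "ticker": ["AWH", "KSPN", "MARA", "CRDF", "PACB"],
    "ret":    [-0.223094, -0.172718, 0.453713, -0.123729, -0.055023],
    "prnk":   [0.938684, 0.933077, 0.931521, 0.929703, 0.879009],
})

# buy the top 3 stocks by predicted rank, equally weighted
top = new.sort_values(by="prnk", ascending=False).head(3)
port_ret = top["ret"].mean()
print(f"equal-weighted portfolio return: {port_ret:.4f}")  # 0.0193
```

Note that the realized returns (ret) are only known at the end of the month; the portfolio is formed using prnk alone.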